Gaussian Mixture Models and Expectation Maximization

Abstract

The goal of the assignment is to use the Expectation Maximization (EM) algorithm to estimate the parameters of a two-component Gaussian mixture in two dimensions. This involves estimating the mean vector μk and covariance matrix Σk for both distributions, as well as the mixing coefficients (or prior probabilities) πk for each component k. EM begins from an arbitrary parameter set. In my implementation, I sampled the initial means from a zero-mean, unit-variance normal distribution and set the covariance matrices to the identity. I set π1 to a random value between zero and one and set π2 = 1 − π1. In the E-step, EM estimates the "responsibilities" of each component for each data point. In the M-step, the means, covariances, and mixing coefficients of both components are re-estimated from the data weighted by these responsibility values. The two steps alternate until a halting criterion is reached. I used the change in the log-likelihood as my halting criterion and ran the algorithm until the change in log-likelihood was less than ε = 0.005.
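The procedure described above can be sketched as follows. This is an illustrative implementation, not the author's original code: the function name `em_gmm_2d`, the seed, and the numerical jitter term are assumptions, but the initialization (means from a standard normal, identity covariances, random π1 with π2 = 1 − π1) and the stopping rule (change in log-likelihood below 0.005) follow the abstract.

```python
import numpy as np

def em_gmm_2d(X, tol=0.005, max_iter=500, seed=0):
    """Fit a two-component 2-D Gaussian mixture with EM.

    Initialization as described in the abstract: means drawn from a
    zero-mean unit-variance normal, identity covariances, and a random
    mixing weight w[0] with w[1] = 1 - w[0].
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    K = 2
    mu = rng.standard_normal((K, d))        # means ~ N(0, I)
    cov = np.stack([np.eye(d)] * K)         # identity covariances
    w = np.empty(K)
    w[0] = rng.uniform()                    # pi_1 in (0, 1)
    w[1] = 1.0 - w[0]                       # pi_2 = 1 - pi_1

    def comp_logpdf(m, S):
        # Log density of N(m, S) at each row of X, via Cholesky factor.
        diff = X - m
        L = np.linalg.cholesky(S)
        sol = np.linalg.solve(L, diff.T)
        maha = np.sum(sol ** 2, axis=0)
        logdet = 2.0 * np.sum(np.log(np.diag(L)))
        return -0.5 * (d * np.log(2.0 * np.pi) + logdet + maha)

    prev_ll = -np.inf
    for _ in range(max_iter):
        # E-step: responsibilities gamma[i, k], computed in log space
        # (log-sum-exp) for numerical stability.
        log_p = np.column_stack(
            [np.log(w[k]) + comp_logpdf(mu[k], cov[k]) for k in range(K)]
        )
        mx = log_p.max(axis=1, keepdims=True)
        log_norm = mx + np.log(np.exp(log_p - mx).sum(axis=1, keepdims=True))
        gamma = np.exp(log_p - log_norm)
        ll = log_norm.sum()

        # Halting criterion: change in log-likelihood below tol.
        if ll - prev_ll < tol:
            break
        prev_ll = ll

        # M-step: responsibility-weighted re-estimation.
        Nk = gamma.sum(axis=0)
        w = Nk / n
        mu = (gamma.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mu[k]
            cov[k] = (gamma[:, k, None] * diff).T @ diff / Nk[k]
            cov[k] += 1e-6 * np.eye(d)      # small jitter (assumption) to keep cov PD

    return w, mu, cov, ll
```

On well-separated synthetic data (e.g. two clusters centered at (−3, −3) and (3, 3)), the fitted means land near the true cluster centers and the mixing weights sum to one; monotone growth of the log-likelihood guarantees the stopping test eventually fires.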


Related articles

Identification of Nonlinear Systems using Gaussian Mixture of Local Models

Identification of operating-regime-based models of nonlinear dynamic systems is addressed. The operating regimes and the parameters of the local linear models are identified directly and simultaneously via Expectation Maximization (EM) identification of a Gaussian Mixture Model (GMM). The proposed technique is demonstrated by means of the identification of a neutralization reaction in a ...


Mixture Models and the Segmentation of Multimodal Textures

A problem with using mixture-of-Gaussian models for unsupervised texture segmentation is that "multimodal" textures (such as are often encountered in natural images) cannot be well represented by a single Gaussian cluster. We propose a divide-and-conquer method that groups Gaussian clusters (estimated via Expectation Maximization) into homogeneous texture classes. This method allows to...


EM algorithms for multivariate Gaussian mixture models with truncated and censored data

We present expectation-maximization (EM) algorithms for fitting multivariate Gaussian mixture models to data that is truncated, censored, or both truncated and censored. These two types of incomplete measurements are naturally handled together through their relation to the multivariate truncated Gaussian distribution. We illustrate our algorithms on synthetic and flow cytometry data.


Spoken Language Identification for Indian Languages Using Split and Merge EM Algorithm

Performance of a Language Identification (LID) system using Gaussian Mixture Models (GMM) is limited by the convergence of the Expectation Maximization (EM) algorithm to local maxima. In this paper, an LID system is described that models the extracted features with Gaussian Mixture Models trained by the Split and Merge Expectation Maximization algorithm, which improves the global convergence of E...


Pii: S0031-3203(01)00133-9

We present a novel method for representing "extruded" distributions. An extruded distribution is an M-dimensional manifold in the parameter space of the component distribution. Representations of that manifold are "continuous mixture models". We present a method for forming one-dimensional continuous Gaussian mixture models of sampled extruded Gaussian distributions via ridges of goodness-of-...




Publication date: 2010